    Does the universe in fact contain almost no information?

    At first sight, an accurate description of the state of the universe appears to require a mind-bogglingly large and perhaps even infinite amount of information, even if we restrict our attention to a small subsystem such as a rabbit. In this paper, it is suggested that most of this information is merely apparent, as seen from our subjective viewpoints, and that the algorithmic information content of the universe as a whole is close to zero. It is argued that if the Schrödinger equation is universally valid, then decoherence together with the standard chaotic behavior of certain non-linear systems will make the universe appear extremely complex to any self-aware subsets that happen to inhabit it now, even if it was in a quite simple state shortly after the big bang. For instance, gravitational instability would amplify the microscopic primordial density fluctuations that are required by the Heisenberg uncertainty principle into quite macroscopic inhomogeneities, forcing the current wavefunction of the universe to contain such Byzantine superpositions as our planet being in many macroscopically different places at once. Since decoherence bars us from experiencing more than one macroscopic reality, we would see seemingly complex constellations of stars, etc., even if the initial wavefunction of the universe was perfectly homogeneous and isotropic.
    Comment: 17 pages, LaTeX, no figures. Online with refs at http://astro.berkeley.edu/~max/nihilo.html (faster from the US), from http://www.mpa-garching.mpg.de/~max/nihilo.html (faster from Europe) or from [email protected]
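    As a toy illustration (ours, not the paper's): a program of a few lines, with near-zero algorithmic information content, can emit output that looks complex and resists compression, echoing the abstract's point that apparent complexity can arise from simple rules plus chaotic dynamics. All names and parameters below are our own choices:

        # Toy sketch: a tiny program generates seemingly complex output.
        import zlib

        x = 0.1                      # simple, exactly describable initial state
        bits = []
        for _ in range(10_000):
            x = 4.0 * x * (1.0 - x)  # logistic map in its chaotic regime
            bits.append("1" if x > 0.5 else "0")

        data = "".join(bits).encode()
        print(len(data), "symbols of output")
        print(len(zlib.compress(data, 9)), "bytes after zlib compression")
        # The output is nearly incompressible (it looks random), yet its true
        # Kolmogorov complexity is bounded by the length of this short program.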

    Research and applications: Artificial intelligence

    A program for developing artificial-intelligence techniques and applying them to the control of mobile automatons that carry out tasks autonomously is reported. Visual scene analysis, short-term problem solving, and long-term problem solving are discussed, along with the PDP-15 simulator, the LISP-FORTRAN-MACRO interface, resolution strategies, and cost effectiveness.

    Computational and Biological Analogies for Understanding Fine-Tuned Parameters in Physics

    In this philosophical paper, we explore computational and biological analogies to address the fine-tuning problem in cosmology. We first clarify what it means for physical constants or initial conditions to be fine-tuned. We review important distinctions, such as that between dimensionless and dimensional physical constants, and the classification of constants proposed by Lévy-Leblond. Then we explore how two great analogies, computational and biological, can give new insights into our problem. This paper includes a preliminary study to examine the two analogies. Importantly, analogies are both useful and fundamental cognitive tools, but they can also be misused or misinterpreted. The idea that our universe might be modelled as a computational entity is analysed, and we discuss the distinction between physical laws and initial conditions using algorithmic information theory. Smolin introduced the theory of "Cosmological Natural Selection" with a biological analogy in mind. We examine an extension of this analogy involving intelligent life, and discuss whether and how this extension can be legitimated.
    Keywords: origin of the universe, fine-tuning, physical constants, initial conditions, computational universe, biological universe, role of intelligent life, cosmological natural selection, cosmological artificial selection, artificial cosmogenesis.
    Comment: 25 pages, Foundations of Science, in press

    Towards a Universal Theory of Artificial Intelligence based on Algorithmic Probability and Sequential Decision Theory

    Decision theory formally solves the problem of rational agents in uncertain worlds if the true environmental probability distribution is known. Solomonoff's theory of universal induction formally solves the problem of sequence prediction for unknown distributions. We unify both theories and give strong arguments that the resulting universal AIXI model behaves optimally in any computable environment. The major drawback of the AIXI model is that it is uncomputable. To overcome this problem, we construct a modified algorithm, AIXI^tl, which is still superior to any other agent bounded by time t and space l. The computation time of AIXI^tl is of order t·2^l.
    Comment: 8 two-column pages, LaTeX2e, 1 figure, submitted to IJCAI
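    For reference, the action-selection rule behind the AIXI model (as given in Hutter's work; transcribed here from memory, so treat the notation as approximate) is an expectimax over future actions a, observations o and rewards r, weighted by a Solomonoff-style prior over programs q for a universal machine U:

        % AIXI: expectimax planning under a universal mixture over environments
        \[
          a_k \;:=\; \arg\max_{a_k} \sum_{o_k r_k} \;\cdots\; \max_{a_m} \sum_{o_m r_m}
          \bigl( r_k + \cdots + r_m \bigr)
          \sum_{q \,:\, U(q,\,a_1 \ldots a_m) \,=\, o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)}
        \]

    Roughly, AIXI^tl replaces this idealized search with a search over the 2^l programs of length at most l, running each for time t per cycle, which is where the t·2^l bound comes from.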

    Self-Referential Noise and the Synthesis of Three-Dimensional Space

    Generalising results from Gödel and Chaitin in mathematics suggests that self-referential systems contain intrinsic randomness. We argue that this is relevant to modelling the universe, and show how three-dimensional space may arise from a non-geometric order-disorder model driven by self-referential noise.
    Comment: Figure labels corrected

    Amplification by stochastic interference

    A new method is introduced for obtaining a strong signal from the interference of weak signals in noisy channels. The method is based on the interference of 1/f noise from parallel channels. One realization of stochastic interference is the auditory nervous system. Stochastic interference may have broad potential applications in information transmission over parallel noisy channels.
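    The generic gain from superposing many noisy channels can be seen in a toy simulation (ours, with white rather than 1/f noise, so it illustrates only the familiar averaging effect, not the paper's specific mechanism):

        # Toy sketch: combining parallel noisy channels strengthens a weak signal.
        import numpy as np

        rng = np.random.default_rng(0)
        n_samples, n_channels = 2_000, 100
        t = np.linspace(0.0, 1.0, n_samples)
        signal = 0.1 * np.sin(2 * np.pi * 5 * t)   # weak common signal

        # Each channel carries the same weak signal in independent noise.
        channels = signal + rng.normal(0.0, 1.0, size=(n_channels, n_samples))
        combined = channels.mean(axis=0)           # superpose the channels

        def snr_db(x, s):
            """Signal-to-noise ratio of x relative to the clean signal s, in dB."""
            return 10 * np.log10(np.mean(s**2) / np.mean((x - s)**2))

        print(f"single channel SNR: {snr_db(channels[0], signal):6.1f} dB")
        print(f"combined SNR:       {snr_db(combined, signal):6.1f} dB")
        # Independent noise averages down as 1/sqrt(N), so the combined SNR
        # gains roughly 10*log10(N) = 20 dB for N = 100 channels.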

    The Computational Complexity of Symbolic Dynamics at the Onset of Chaos

    In a variety of studies of dynamical systems, the edge of order and chaos has been singled out as a region of complexity. It was suggested by Wolfram, on the basis of qualitative behaviour of cellular automata, that the computational basis for modelling this region is the Universal Turing Machine. In this paper, following a suggestion of Crutchfield, we try to show that the Turing machine model may often be too powerful as a computational model to describe the boundary of order and chaos. In particular, we study the region of the first accumulation of period doubling in unimodal and bimodal maps of the interval from the point of view of language theory. We show that, in relation to the "extended" Chomsky hierarchy, the relevant computational model in the unimodal case is the nested stack automaton or the related indexed languages, while the bimodal case is modeled by the linear bounded automaton or the related context-sensitive languages.
    Comment: 1 reference corrected, 1 reference added, minor changes in body of manuscript
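    As a concrete taste of the symbolic dynamics involved (our illustration, drawn from the standard literature rather than from this paper): the kneading sequence of a unimodal map at the period-doubling accumulation point is the fixed point of a two-letter substitution, which a few lines of code can generate:

        # Sketch: the period-doubling substitution R -> RL, L -> RR; its fixed
        # point is the kneading sequence at the accumulation of period doubling.
        RULES = {"R": "RL", "L": "RR"}

        def substitute(word: str) -> str:
            """Apply the substitution to every symbol of the word."""
            return "".join(RULES[c] for c in word)

        word = "R"
        for _ in range(5):
            word = substitute(word)
        print(word)  # RLRRRLRLRLRRRLRR... (a prefix of the kneading sequence)

    Although the sequence is produced by so simple a rule, the paper's result places the corresponding language at the indexed level of the extended Chomsky hierarchy, above the context-free languages.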

    Algorithmic statistics revisited

    The mission of statistics is to provide adequate statistical hypotheses (models) for observed data. But what is an "adequate" model? To answer this question, one needs to use the notions of algorithmic information theory. It turns out that for every data string x one can naturally define a "stochasticity profile", a curve that represents a trade-off between the complexity of a model and its adequacy. This curve has four different but equivalent definitions, in terms of (1) randomness deficiency, (2) minimal description length, (3) position in the lists of simple strings, and (4) Kolmogorov complexity with decompression time bounded by the busy beaver function. We present a survey of the corresponding definitions and results relating them to each other.
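    For orientation, the two central notions, in the finite-set-model notation standard in algorithmic statistics (our transcription, so the survey's own conventions may differ slightly), are:

        % Randomness deficiency of x in a finite model A, and the
        % stochasticity profile traded off against model complexity alpha:
        \[
          d(x \mid A) \;=\; \log |A| \;-\; C(x \mid A),
          \qquad
          h_x(\alpha) \;=\; \min \{\, d(x \mid A) \;:\; x \in A,\ C(A) \le \alpha \,\}
        \]

    Here C denotes Kolmogorov complexity; a model A is adequate for x when x looks like a typical element of A, i.e. when d(x|A) is small.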

    AGI and the Knight-Darwin Law: why idealized AGI reproduction requires collaboration

    Can an AGI create a more intelligent AGI? Under idealized assumptions, for a certain theoretical type of intelligence, our answer is: “Not without outside help”. This is a paper on the mathematical structure of AGI populations in which parent AGIs create child AGIs. We argue that such populations satisfy a certain biological law. Motivated by observations of sexual reproduction in seemingly asexual species, the Knight-Darwin Law states that it is impossible for one organism to asexually produce another, which asexually produces another, and so on forever: any sequence of organisms (each one a child of the previous) must contain occasional multi-parent organisms or must terminate. By proving that a certain measure (arguably an intelligence measure) decreases when an idealized parent AGI single-handedly creates a child AGI, we argue that a similar law holds for AGIs.
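    The skeleton of the argument, as we read the abstract (assuming, as in the authors' framework, that the measure takes values in a well-ordered set such as the ordinals), is a descending-chain argument:

        % If mu strictly decreases at every single-parent step, then
        %   mu(A_1) > mu(A_2) > mu(A_3) > ...
        % would be an infinite strictly descending chain in a well-ordered set,
        % which cannot exist; hence an infinite lineage needs multi-parent steps.
        \[
          \mu(\text{child}) < \mu(\text{parent})
          \;\Longrightarrow\;
          \text{no infinite single-parent chain } A_1, A_2, A_3, \ldots
        \]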

    Artificial Sequences and Complexity Measures

    In this paper we exploit concepts from information theory to address the fundamental problem of identifying and defining the most suitable tools for extracting, in an automatic and agnostic way, information from a generic string of characters. In particular, we introduce a class of methods which use data compression techniques in a crucial way to define a measure of remoteness and distance between pairs of sequences of characters (e.g. texts) based on their relative information content. We also discuss in detail how specific features of data compression techniques can be used to introduce the notions of the dictionary of a given sequence and of an Artificial Text, and we show how these new tools can be used for information extraction. We point out the versatility and generality of our method, which applies to any kind of corpus of character strings independently of the type of coding behind them. As a case study we consider linguistically motivated problems, presenting results for automatic language recognition, authorship attribution and self-consistent classification.
    Comment: Revised version, with major changes, of the previous "Data Compression approach to Information Extraction and Classification" by A. Baronchelli and V. Loreto. 15 pages; 5 figures
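    To convey the flavor of compression-based distances (using the related normalized compression distance rather than the paper's own remoteness estimator; compressor choice and sample texts below are our own), here is a sketch with an off-the-shelf compressor:

        # Sketch: a normalized compression distance between character strings.
        import zlib

        def csize(s: bytes) -> int:
            """Compressed size of s, a rough proxy for its information content."""
            return len(zlib.compress(s, 9))

        def ncd(a: bytes, b: bytes) -> float:
            """Smaller values mean the compressor finds more shared structure."""
            ca, cb, cab = csize(a), csize(b), csize(a + b)
            return (cab - min(ca, cb)) / max(ca, cb)

        text_en1 = b"to be or not to be that is the question " * 40
        text_en2 = b"all the world is a stage and the men and women merely players " * 25
        text_it  = b"essere o non essere questo e il problema " * 40
        print(f"English vs English: {ncd(text_en1, text_en2):.3f}")
        print(f"English vs Italian: {ncd(text_en1, text_it):.3f}")
        # On real corpora, same-language pairs typically score lower, which is
        # the basis for language recognition and authorship attribution.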